29 research outputs found

    Is the Web ready for HTTP/2 Server Push?

    Full text link
    HTTP/2 supersedes HTTP/1.1 to tackle the performance challenges of the modern Web. A highly anticipated feature is Server Push, which enables servers to send data without explicit client requests, thus potentially saving time. Although guidelines on how to use Server Push have emerged, measurements have shown that it can easily be used in a suboptimal way and hurt rather than improve performance. We thus tackle the question of whether the current Web can make better use of Server Push. First, we enable real-world websites to be replayed in a testbed to study the effects of different Server Push strategies. Using this, we next revisit proposed guidelines to grasp their performance impact. Finally, based on our results, we propose a novel strategy using an alternative server scheduler that enables resources to be interleaved. This improves the visual progress for some websites, with minor modifications to the deployment. Still, our results highlight the limits of Server Push: a deep understanding of web engineering is required to make optimal use of it, and not every site will benefit. Comment: More information available at https://push.netray.i
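
    For orientation only (this is not the paper's scheduler or replay testbed): a minimal sketch of how a server can issue HTTP/2 pushes, using the http.Pusher interface from Go's net/http package. The resource paths and certificate files are placeholder assumptions.

```go
package main

import (
	"log"
	"net/http"
)

// indexHandler serves the page and, when the connection is HTTP/2,
// pushes the stylesheet and script before the HTML so the client
// does not have to discover and request them separately.
func indexHandler(w http.ResponseWriter, r *http.Request) {
	if pusher, ok := w.(http.Pusher); ok {
		for _, res := range []string{"/static/app.css", "/static/app.js"} {
			if err := pusher.Push(res, nil); err != nil {
				log.Printf("push of %s failed: %v", res, err)
			}
		}
	}
	http.ServeFile(w, r, "static/index.html")
}

func main() {
	http.HandleFunc("/", indexHandler)
	// Server Push requires HTTP/2, which net/http negotiates automatically over TLS.
	log.Fatal(http.ListenAndServeTLS(":8443", "cert.pem", "key.pem", nil))
}
```

    On a plain HTTP/1.1 connection the type assertion fails and the handler simply serves the page without pushing, which mirrors the paper's point that push is an optimization layered on top of normal delivery, not a replacement for it.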

    Taming Webpage Complexity to Optimize User Experience on Mobile Devices

    No full text
    Despite web access on mobile devices becoming commonplace, users continue to experience poor web performance on these devices. Traditional approaches for improving web performance face an uphill battle due to the fundamentally conflicting trends in user expectations of lower load times and richer web content. Embracing the reality that page load times will continue to be higher than user tolerance limits for the foreseeable future, we ask: how can we deliver the best possible user experience? To establish the positive correlation between rich webpage content and high webpage load times, we perform the first known measurement-driven study of the complexity of web pages and its impact on performance. To mirror a client-side view, we use browser-based measurements of over 2000 websites from 4 geo-diverse vantage points over a 3-week period. We find that we can accurately predict page load times using a handful of metrics, with the number of resources requested (content richness) being the most critical factor. Given the rising amount of webpage content and rising load times, strategic reprioritization of content offers a parallel avenue to improve the user's page load experience. To this end, we present Klotski, a system that prioritizes the content most relevant to a user's preferences. In designing Klotski, we address several challenges: (1) accounting for inter-resource dependencies on a page; (2) enabling fast selection and load time estimation for the subset of resources to be prioritized; and (3) developing a practical implementation that requires no changes to websites. Across a range of user preference criteria, Klotski can significantly improve the user experience relative to native websites. Finally, we investigate the potential to further improve the user experience over that offered by Klotski by pushing all content on a web page to any client attempting to load the page, but find that this offers little to no improvement in page load times due to the limited capability of the client to consume the pushed content. This result reinforces Klotski's focus on limited resource reprioritization for fixed time period goals.
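
    Klotski's actual selection and load-time estimation machinery is not reproduced here. The sketch below, under assumed data structures, only illustrates the general idea of preference-driven reprioritization: greedily pick high-utility resources within a byte budget while pulling in any unselected parent dependencies.

```go
package main

import (
	"fmt"
	"sort"
)

// Resource is a simplified page object with a parent dependency, an
// estimated transfer size, and a user-preference utility score.
// This is an illustrative model, not Klotski's data structures.
type Resource struct {
	URL     string
	Parent  string // "" for the root document
	Bytes   int
	Utility float64
}

// prioritize greedily selects high-utility resources that fit in the byte
// budget, including any not-yet-selected ancestors so that dependencies
// are respected. A sketch of the general idea only.
func prioritize(resources []Resource, budget int) []string {
	byURL := map[string]Resource{}
	for _, r := range resources {
		byURL[r.URL] = r
	}
	sorted := append([]Resource(nil), resources...)
	sort.Slice(sorted, func(i, j int) bool { return sorted[i].Utility > sorted[j].Utility })

	selected := map[string]bool{}
	used := 0
	var order []string
	for _, r := range sorted {
		// Collect the chain of unselected ancestors plus the resource itself.
		var chain []Resource
		cost := 0
		for cur := r; ; {
			if !selected[cur.URL] {
				chain = append([]Resource{cur}, chain...)
				cost += cur.Bytes
			}
			if cur.Parent == "" {
				break
			}
			cur = byURL[cur.Parent]
		}
		if used+cost > budget {
			continue
		}
		for _, c := range chain {
			selected[c.URL] = true
			order = append(order, c.URL)
		}
		used += cost
	}
	return order
}

func main() {
	resources := []Resource{
		{URL: "/", Parent: "", Bytes: 20, Utility: 1.0},
		{URL: "/hero.jpg", Parent: "/", Bytes: 300, Utility: 0.9},
		{URL: "/app.css", Parent: "/", Bytes: 40, Utility: 0.8},
		{URL: "/ads.js", Parent: "/", Bytes: 120, Utility: 0.1},
	}
	fmt.Println(prioritize(resources, 400)) // [/ /hero.jpg /app.css]
}
```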

    Understanding Website Complexity: Measurements, Metrics, and Implications

    No full text
    Over the years, the web has evolved from simple text content served from a single server to a complex ecosystem with different types of content from servers spread across several administrative domains. There is anecdotal evidence of users being frustrated with high page load times or when obscure scripts cause their browser windows to freeze. Because page load times are known to directly impact user satisfaction, providers would like to understand if and how the complexity of their websites affects the user experience. While there is an extensive literature on measuring web graphs, website popularity, and the nature of web traffic, there has been little work in understanding how complex individual websites are, and how this complexity impacts the clients’ experience. This paper is a first step to address this gap. To this end, we identify a set of metrics to characterize the complexity of websites both at a content level (e.g., number and size of images) and service level (e.g., number of servers/origins). We find that the distributions of these metrics are largely independent of a website’s popularity rank. However, some categories (e.g., News) are more complex than others. More than 60% of websites have content from at least 5 non-origin sources, and these contribute more than 35% of the bytes downloaded. In addition, we analyze which metrics are most critical for predicting page render and load times and find that the number of objects requested is the most important factor. With respect to variability in load times, however, we find that the number of servers is the best indicator.
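
    As a concrete illustration of the content- and service-level metrics described (not the paper's measurement toolchain), the sketch below computes the object count, the number of distinct origins, and the share of non-origin bytes from a hypothetical list of fetched objects, such as might be extracted from a HAR capture.

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// Request is a simplified record of one fetched object (hypothetical schema).
type Request struct {
	URL   string
	Bytes int
}

// complexityMetrics computes a few content- and service-level metrics:
// object count, number of distinct origins, and the share of bytes
// served from non-origin sources.
func complexityMetrics(origin string, reqs []Request) (objects, origins int, nonOriginByteShare float64) {
	hosts := map[string]bool{}
	totalBytes, nonOriginBytes := 0, 0
	for _, r := range reqs {
		u, err := url.Parse(r.URL)
		if err != nil {
			continue
		}
		hosts[u.Hostname()] = true
		totalBytes += r.Bytes
		if !strings.HasSuffix(u.Hostname(), origin) {
			nonOriginBytes += r.Bytes
		}
	}
	objects = len(reqs)
	origins = len(hosts)
	if totalBytes > 0 {
		nonOriginByteShare = float64(nonOriginBytes) / float64(totalBytes)
	}
	return
}

func main() {
	reqs := []Request{
		{"https://www.example.com/index.html", 30_000},
		{"https://static.example.com/app.css", 15_000},
		{"https://cdn.thirdparty.net/lib.js", 90_000},
		{"https://ads.tracker.io/pixel.gif", 1_000},
	}
	obj, orig, share := complexityMetrics("example.com", reqs)
	fmt.Printf("objects=%d origins=%d non-origin bytes=%.0f%%\n", obj, orig, share*100)
}
```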

    CSPAN

    No full text